Jan is an open-source software publisher whose flagship application, also named Jan, turns a compatible computer into a local AI workstation by packaging a lightweight inference server with a polished desktop interface. Designed for privacy-minded developers, researchers, and advanced enthusiasts, Jan bundles multiple open-source large-language-model runtimes, such as llama.cpp and TensorRT-LLM, behind a single control panel, letting users download, switch, and quantize models without command-line tinkering. Typical use cases include offline chat assistants, codebase summarization, document Q&A, transcription pipelines, and automated content generation, all executed on the local GPU or CPU without external API calls.

The software exposes an OpenAI-compatible localhost endpoint, so existing tools such as scripts, IDE extensions, and web front-ends can migrate seamlessly to a fully local backend. Model management handles GGUF, GPTQ, and ONNX formats, while telemetry and update checks remain opt-in to preserve user anonymity. Hardware acceleration is auto-detected for CUDA, ROCm, Apple Metal, and Intel Arc, ensuring consistent performance across Windows, macOS, and Linux.

By wrapping complex inference dependencies into a single downloadable bundle, Jan democratizes on-device AI experimentation for data scientists, privacy officers, and hobbyists who prefer to keep prompts, weights, and outputs on their own machines. Jan's software is available for free on get.nero.com, where the latest Windows builds are delivered through trusted package sources such as winget, support batch installation alongside other applications, and are kept automatically up to date.
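Because the local server speaks the OpenAI chat-completions protocol, pointing an existing client at it is mostly a matter of swapping the base URL. The sketch below builds such a request with only the standard library; the port (1337) and the model name are assumptions for illustration — check Jan's server settings for the actual address and your installed model's identifier.

```python
import json

# Assumed default address of Jan's local OpenAI-compatible server;
# verify the host/port in Jan's API server settings.
JAN_BASE_URL = "http://localhost:1337/v1"

def build_chat_request(model: str, user_message: str):
    """Return (url, json_body) for an OpenAI-style /chat/completions call.

    The payload shape (model, messages, stream) follows the OpenAI
    chat-completions schema that Jan's endpoint mirrors.
    """
    url = f"{JAN_BASE_URL}/chat/completions"
    payload = {
        "model": model,  # hypothetical model id; use one you have downloaded
        "messages": [{"role": "user", "content": user_message}],
        "stream": False,
    }
    return url, json.dumps(payload)

url, body = build_chat_request("llama3-8b-instruct", "Summarize this document.")
print(url)  # prints http://localhost:1337/v1/chat/completions
```

POSTing `body` to `url` (for example with `urllib.request` or the `openai` client configured with `base_url=JAN_BASE_URL`) keeps the entire round trip on the local machine, which is the point of the design: no prompt or completion ever leaves the host.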

Jan

Turn your computer into an AI computer
